Locally Weighted Bayesian Regression

Author

  • Andrew W. Moore
Abstract

1 The Problem

Suppose we have a dataset with $N$ datapoints. Each datapoint consists of a vector of inputs and a real-valued output, so the dataset is

$(x_0, y_0),\; (x_1, y_1),\; \ldots,\; (x_{N-1}, y_{N-1}).$

The inputs need not be real-valued. All we require of them is a distance metric measuring the similarity of a pair of input vectors,

$\mathrm{Dist} : (x, x') \rightarrow \Re \quad (1)$

and a set of $M$ basis functions

$\phi_0 : x \rightarrow \Re,\; \phi_1 : x \rightarrow \Re,\; \ldots,\; \phi_{M-1} : x \rightarrow \Re. \quad (2)$

Write $z(x) = (\phi_0(x), \phi_1(x), \ldots, \phi_{M-1}(x))$, and then write $z_i = z(x_i)$.
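
To make the setup concrete, the sketch below shows one plausible way the ingredients combine at prediction time: distance-decay weights around a query point and a weighted Bayesian linear regression on the basis-function features z(x). The Gaussian kernel, the prior precision `alpha`, and the noise precision `beta` are assumptions made for the sketch, not details taken from the paper.

```python
import numpy as np

def locally_weighted_bayesian_predict(X, y, z, dist, x_q,
                                      bandwidth=1.0, alpha=1.0, beta=1.0):
    """Predict the output at query x_q with a distance-weighted
    Bayesian linear regression on basis-function features z(x).

    X, y      : training inputs (any objects) and real-valued outputs
    z         : callable mapping an input to an M-vector of basis values
    dist      : callable Dist(x, x') -> nonnegative real
    bandwidth : width of the (assumed) Gaussian distance-decay kernel
    alpha     : prior precision on the coefficients (assumption)
    beta      : noise precision (assumption)
    """
    Z = np.array([z(x) for x in X])                       # N x M design matrix
    w = np.array([np.exp(-dist(x, x_q) ** 2 / (2.0 * bandwidth ** 2)) for x in X])

    M = Z.shape[1]
    ZtW = Z.T * w                                         # M x N, columns scaled by weights
    S_inv = alpha * np.eye(M) + beta * ZtW @ Z            # posterior precision
    coef_mean = beta * np.linalg.solve(S_inv, ZtW @ y)    # posterior mean coefficients

    z_q = np.asarray(z(x_q))
    y_hat = z_q @ coef_mean                               # predictive mean
    y_var = 1.0 / beta + z_q @ np.linalg.solve(S_inv, z_q)  # predictive variance
    return y_hat, y_var


# Example use with 1-D inputs and a constant-plus-linear basis z(x) = (1, x).
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X) + 0.1 * np.random.randn(50)
y_hat, y_var = locally_weighted_bayesian_predict(
    X, y, z=lambda x: np.array([1.0, x]), dist=lambda a, b: abs(a - b), x_q=0.3)
```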

Similar articles

A Family of Geographically Weighted Regression Models

A Bayesian treatment of locally linear regression methods introduced in McMillen (1996) and labeled geographically weighted regressions (GWR) in Brunsdon, Fotheringham and Charlton (1996) is set forth in this paper. GWR uses distance-decay-weighted sub-samples of the data to produce locally linear estimates for every point in space. While the use of locally linear regression represents a true c...
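
The distance-decay weighting at the heart of GWR is straightforward to write down. Below is a minimal sketch of the classical (non-Bayesian) GWR point estimates, assuming a Gaussian distance-decay kernel with bandwidth `theta`; the Bayesian treatment developed in the paper is not reproduced here.

```python
import numpy as np

def gwr_local_coefficients(coords, X, y, theta=1.0):
    """Classical GWR: one weighted-least-squares fit per location.

    coords : (N, 2) spatial coordinates of the observations
    X      : (N, K) design matrix (include a column of ones for the intercept)
    y      : (N,) response
    theta  : bandwidth of the (assumed) Gaussian distance-decay kernel
    """
    N, K = X.shape
    betas = np.empty((N, K))
    for i in range(N):
        d = np.linalg.norm(coords - coords[i], axis=1)   # distances to point i
        w = np.exp(-(d / theta) ** 2)                    # distance-decay weights
        XtW = X.T * w
        betas[i] = np.linalg.solve(XtW @ X, XtW @ y)     # local WLS estimate
    return betas
```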

A Bayesian Nominal Regression Model with Random Effects for Analysing Tehran Labor Force Survey Data

Large survey datasets are often accompanied by sampling weights that reflect the unequal probabilities of selecting subjects under a complex sampling design. Sampling weights act as expansion factors that, by scaling the subjects, make the sample representative of the population. The quasi-maximum likelihood method is one of the approaches for incorporating sampling weights in the frequentist framewo...
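
As a concrete reading of the "expansion factor" idea, the sketch below scales each subject's log-likelihood contribution by its sampling weight, which is the quantity a quasi-maximum-likelihood fit would maximise. The multinomial-logit form and the variable names are assumptions for illustration, not the model specified in the paper.

```python
import numpy as np

def weighted_multinomial_loglik(B, X, y, w):
    """Survey-weighted (pseudo) log-likelihood for a multinomial logit.

    Each subject's contribution is scaled by its sampling weight w_i,
    so weighted subjects stand in for the population they represent.

    B : (K, C) coefficient matrix, one column per response category
    X : (N, K) covariates
    y : (N,) integer category labels in {0, ..., C-1}
    w : (N,) sampling weights
    """
    logits = X @ B                                        # N x C scores
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return np.sum(w * log_probs[np.arange(len(y)), y])    # weighted sum of log-likelihoods
```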

Approximate Nearest Neighbor Regression in Very High Dimensions

Fast and approximate nearest-neighbor search methods have recently become popular for scaling nonparametric regression to more complex and high-dimensional applications. As an alternative to fast nearest-neighbor search, training data can also be incorporated online into appropriate sufficient statistics and adaptive data structures, such that approximate nearest-neighbor predictions can be acc...
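
To make the "sufficient statistics" idea concrete, here is a minimal sketch of a single local linear model whose weighted statistics are accumulated online, so refitting never needs to revisit stored training points. The Gaussian receptive field, the fixed bandwidth, and the small ridge term are assumptions for the sketch; the adaptive data structures and approximate nearest-neighbor machinery discussed in the paper are not reproduced.

```python
import numpy as np

class OnlineLocalModel:
    """One local linear model maintained from running sufficient statistics.

    Weighted X^T X and X^T y are accumulated incrementally, so the local
    fit can be updated per datapoint without storing the data itself.
    """
    def __init__(self, dim, center, bandwidth=1.0):
        self.center = np.asarray(center, dtype=float)
        self.bandwidth = bandwidth
        self.XtX = np.zeros((dim + 1, dim + 1))   # +1 for the intercept term
        self.Xty = np.zeros(dim + 1)

    def update(self, x, y):
        """Fold one (x, y) pair into the weighted sufficient statistics."""
        w = np.exp(-np.sum((np.asarray(x) - self.center) ** 2)
                   / (2.0 * self.bandwidth ** 2))
        xa = np.append(x, 1.0)                    # augment with intercept
        self.XtX += w * np.outer(xa, xa)
        self.Xty += w * xa * y

    def predict(self, x):
        """Solve the (ridge-stabilised) local weighted least squares and predict."""
        ridge = 1e-8 * np.eye(len(self.Xty))
        beta = np.linalg.solve(self.XtX + ridge, self.Xty)
        return np.append(x, 1.0) @ beta
```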

Kernel Carpentry for Online Regression Using Randomly Varying Coefficient Model

We present a Bayesian formulation of locally weighted learning (LWL) using the novel concept of a randomly varying coefficient model. Based on this, we propose a mechanism for multivariate non-linear regression using spatially localised linear models that learn completely independently of each other, use only local information, and adapt the local model complexity in a data-driven fashion. We d...

Efficient Locally Weighted Polynomial Regression Predictions

Locally weighted polynomial regression (LWPR) is a popular instance-based algorithm for learning continuous non-linear mappings. For more than two or three inputs and more than a few thousand datapoints, the computational expense of predictions is daunting. We discuss drawbacks of previous approaches to dealing with this problem, and present a new algorithm based on a multiresolution searc...
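
The expense referred to above comes from the fact that a naive LWPR prediction touches every stored datapoint: weights are recomputed and a weighted least-squares system is solved from scratch for each query. The sketch below shows that naive baseline for a 1-D input (the Gaussian kernel and the monomial basis are assumptions), making the per-query cost visible; the multiresolution search the paper develops to avoid it is not reproduced.

```python
import numpy as np

def naive_lwpr_predict(X, y, x_q, degree=2, bandwidth=1.0):
    """Naive locally weighted polynomial regression for a 1-D input.

    Every query walks the full training set: N weight evaluations plus a
    weighted least-squares solve over the M = degree + 1 polynomial terms.
    This per-query cost is what motivates faster prediction schemes.
    """
    def poly(x):
        # Monomial basis (1, x, x^2, ..., x^degree).
        return np.array([x ** d for d in range(degree + 1)])

    Z = np.array([poly(x) for x in X])                        # N x M
    w = np.exp(-((np.asarray(X) - x_q) / bandwidth) ** 2)     # N kernel weights
    ZtW = Z.T * w
    ridge = 1e-8 * np.eye(degree + 1)                         # numerical safeguard
    beta = np.linalg.solve(ZtW @ Z + ridge, ZtW @ y)          # local polynomial fit
    return poly(x_q) @ beta
```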

Bayesian Backfitting

Whenever a graphical model contains connections from multiple nodes to a single node, statistical inference of model parameters may require the evaluation and possibly the inversion of the covariance matrix of all variables contributing to such a fan-in, particularly in the context of regression and classification. Thus, for high dimensional fan-ins, statistical inference can become computation...
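
As a point of reference for how backfitting sidesteps that inversion, the sketch below shows the generic (non-Bayesian) backfitting iteration for a plain linear model: each coefficient is refit against the partial residual of the others, so no K x K fan-in covariance matrix is ever inverted. The Bayesian formulation developed in the paper is not reproduced here.

```python
import numpy as np

def backfit_linear(X, y, n_iter=100):
    """Generic backfitting (coordinate-wise refitting) for a linear model.

    Each pass refits one coefficient against the partial residual left by
    the others, avoiding the inversion of the full covariance matrix of
    the K inputs feeding into the output node.
    """
    N, K = X.shape
    beta = np.zeros(K)
    for _ in range(n_iter):
        for j in range(K):
            r_j = y - X @ beta + X[:, j] * beta[j]            # partial residual
            beta[j] = X[:, j] @ r_j / (X[:, j] @ X[:, j])     # univariate refit
    return beta
```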

Publication date: 1995